least squares estimate - definition

APPROXIMATION METHOD IN STATISTICS
Method of least squares; Least-squares method; Least-squares estimation; Least-Squares Fitting; Least squares fitting; Sum of Squared Error; Least-squares; Least squares approximation; Least-squares approximation; Least squares method; Least-squares analysis; Least squares fit; Least squares problem; Least-squares problem; LSQF; Principle of least squares; Least-squares fit; Method of Least Squares; Least Squares
[Figures: Carl Friedrich Gauss; the "fanning out" effect of heteroscedasticity; the result of fitting a set of data points with a quadratic function; the residuals plotted against the corresponding <math>x</math> values, where a parabolic pattern of fluctuations about <math>r_i=0</math> indicates a parabolic model is appropriate; conic fitting of a set of points using least-squares approximation]
Least squares         
The method of least squares is a standard approach in regression analysis to approximate the solution of overdetermined systems (sets of equations in which there are more equations than unknowns) by minimizing the sum of the squares of the residuals (a residual being the difference between an observed value and the fitted value provided by a model) made in the results of each individual equation.
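As a minimal illustrative sketch (not part of the entry above; the data are invented), an overdetermined linear system can be solved in the least-squares sense with NumPy:

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns (slope a, intercept b).
# The data lie exactly on y = 2x + 1, so the least-squares fit should
# recover a = 2, b = 1 with residuals ~ 0.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                          # observed values
A = np.column_stack([x, np.ones_like(x)])  # design matrix [x | 1]

# Minimise ||A @ beta - y||^2 over beta = (a, b).
beta, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
a, b = beta
```

With noisy data the same call returns the coefficients minimizing the sum of squared residuals rather than an exact solution.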
least squares         
¦ noun a method of estimating a quantity or fitting a graph to data so as to minimize the sum of the squares of the differences between the observed values and the estimated values.
Ordinary least squares         
[Figures: fitted regression; residuals plot; scatterplot of the data, where the relationship is slightly curved but close to linear; OLS estimation viewed as a projection onto the linear space spanned by the regressors, where each of <math>X_1</math> and <math>X_2</math> refers to a column of the data matrix; Okun's law in macroeconomics, in which the ordinary least squares method is used to construct the regression line relating GDP growth to changes in the unemployment rate]
METHOD FOR ESTIMATING THE UNKNOWN PARAMETERS IN A LINEAR REGRESSION MODEL
Ordinary least squares regression; Normal equations; Ordinary Least Squares; Ordinary list squares; OLS Regression; Ordinary least square; Standard error of the equation; OLS regression; Ordinary Least Squares Regression; Partitioned regression; Least-squares normal matrix; Large Sample Properties
In statistics, ordinary least squares (OLS) is a type of linear least squares method for estimating the unknown parameters in a linear regression model. OLS chooses the parameters of a linear function of a set of explanatory variables by the principle of least squares: minimizing the sum of the squared differences between the observed values of the dependent variable in the input dataset and the values predicted by the linear function of the independent variables.
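A sketch of OLS via the normal equations (X′X)β̂ = X′y ("Normal equations" is one of the aliases listed above); the data are synthetic, invented for illustration:

```python
import numpy as np

# OLS via the normal equations (X'X) beta_hat = X'y.
# Synthetic data: intercept 1.5, slope -0.7, small Gaussian noise.
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_beta = np.array([1.5, -0.7])
y = X @ true_beta + 0.01 * rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # solve the normal equations
```

In practice a QR or SVD based solver (e.g. `np.linalg.lstsq`) is preferred over forming X′X explicitly, for numerical stability.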
Least trimmed squares         
Least Trimmed Squares
Least trimmed squares (LTS), or least trimmed sum of squares, is a robust statistical method that fits a function to a set of data whilst not being unduly affected by the presence of outliers. It is one of a number of methods for robust regression.
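A rough sketch of the LTS idea using "concentration" steps (refit on the h observations with the smallest squared residuals). Real LTS solvers use many random starts; this single-start loop and its data are invented for illustration:

```python
import numpy as np

# Clean data on y = 1 + 3x, plus five gross outliers.
x = np.linspace(0.0, 1.0, 30)
y = 1.0 + 3.0 * x
y[:5] += 10.0                      # outliers

X = np.column_stack([np.ones_like(x), x])
h = 20                             # number of observations to trust (h <= n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
for _ in range(20):
    r2 = (y - X @ beta) ** 2
    keep = np.argsort(r2)[:h]                 # h smallest squared residuals
    beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
```

Each concentration step cannot increase the trimmed sum of squares, so the loop settles on a fit through the clean points while the outliers are ignored.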
Least mean squares filter         
ALGORITHM
Least mean squares; NLMS; Normalised Least mean squares filter; Normalized Least mean squares filter; Normalized least mean squares filter; Normalised least mean squares filter; LMS filter
Least mean squares (LMS) algorithms are a class of adaptive filter used to mimic a desired filter by finding the filter coefficients that relate to producing the least mean square of the error signal (difference between the desired and the actual signal). It is a stochastic gradient descent method in that the filter is only adapted based on the error at the current time.
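The LMS update w ← w + μ·e·u can be sketched as follows (identification of a made-up 4-tap filter; the step size and signals are illustrative only):

```python
import numpy as np

# LMS sketch: adapt a 4-tap filter w to mimic a hypothetical "desired"
# filter h_true by stochastic gradient steps on the squared error.
rng = np.random.default_rng(2)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
w = np.zeros(4)
mu = 0.05                                  # step size

x = rng.normal(size=5000)                  # input signal
for n in range(3, len(x)):
    u = x[n - 3:n + 1][::-1]               # most recent 4 samples, newest first
    d = h_true @ u                         # desired signal
    e = d - w @ u                          # error signal (desired - actual)
    w = w + mu * e * u                     # LMS coefficient update
```

Note the stochastic-gradient character: only the current sample's error drives each update.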
Regularized least squares         
Penalized least squares; Regularized least-squares; Least squares regularization; Regularized regression; Penalized estimation; Penalized regression
Regularized least squares (RLS) is a family of methods for solving the least-squares problem while using regularization to further constrain the resulting solution.
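Ridge regression is one member of this family; a sketch with invented data, using the closed form b = (X′X + λI)⁻¹X′y:

```python
import numpy as np

# Ridge regression: minimise ||X b - y||^2 + lam * ||b||^2.
rng = np.random.default_rng(3)
X = rng.normal(size=(40, 3))
y = X @ np.array([2.0, 0.0, -1.0]) + 0.1 * rng.normal(size=40)

lam = 1.0
p = X.shape[1]
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
b_ols = np.linalg.solve(X.T @ X, X.T @ y)  # unregularized, for comparison
```

The penalty shrinks the solution: the norm of `b_ridge` is never larger than that of `b_ols`, which is how the regularizer constrains the result.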
Non-linear least squares         
IN STATISTICS, A METHOD USED IN REGRESSION ANALYSIS
NLLS; Nonlinear least squares; Non-linear least-squares estimation; Nonlinear Least Squares; Numerical methods for non-linear least squares
Non-linear least squares is the form of least squares analysis used to fit a set of m observations with a model that is non-linear in n unknown parameters (m ≥ n). It is used in some forms of nonlinear regression.
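One common iterative scheme is Gauss-Newton: linearize the residuals, solve a linear least-squares step, repeat. A sketch on an invented model y = a·exp(bx), with m = 20 observations and n = 2 parameters:

```python
import numpy as np

# Gauss-Newton for fitting y = a * exp(b * x); noise-free data, so the
# iteration should converge to a = 2, b = 0.5.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * x)

a, b = 1.5, 0.3                     # starting guess
for _ in range(30):
    f = a * np.exp(b * x)
    r = y - f                       # current residuals
    # Jacobian of the model with respect to (a, b)
    J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
    step = np.linalg.lstsq(J, r, rcond=None)[0]   # linear LS subproblem
    a, b = a + step[0], b + step[1]
```

Each pass solves exactly the kind of linear least-squares problem described above, which is why the core calculation is the same in the linear and nonlinear cases.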
Iteratively reweighted least squares         
Iteratively Re-weighted Least Squares; IRLS; Iterative weighted least squares; Iteratively re-weighted least squares; IRWLS; Iteratively weighted least squares
The method of iteratively reweighted least squares (IRLS) is used to solve certain optimization problems with objective functions of the form of a p-norm:

<math>\underset{\boldsymbol\beta}{\operatorname{arg\,min}} \sum_{i=1}^n \left| y_i - f_i(\boldsymbol\beta) \right|^p ,</math>

by an iterative method in which each step involves solving a weighted least squares problem.
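A sketch of IRLS for p = 1 (L1 regression), where each iteration solves a weighted least-squares problem with weights |r_i|^(p-2); the data and the clipping floor are invented for illustration:

```python
import numpy as np

# IRLS for an L1 fit: 36 clean points on y = 1 + 2x plus 4 outliers.
x = np.linspace(0.0, 1.0, 40)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
y[:4] += 5.0                       # outliers the L1 fit should resist

p = 1.0
beta = np.linalg.lstsq(X, y, rcond=None)[0]    # ordinary LS start
for _ in range(50):
    r = y - X @ beta
    w = np.clip(np.abs(r), 1e-8, None) ** (p - 2.0)  # IRLS weights
    sw = np.sqrt(w)                                  # weighted LS via scaling
    beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
```

The clip floor avoids division by zero as residuals of well-fit points approach zero.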
Constrained least squares         
Constrained linear least squares; Mixed estimation; Restricted least squares
In constrained least squares one solves a linear least squares problem with an additional constraint on the solution.
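For an equality constraint Cx = d, the problem can be solved via the KKT system; a sketch with random invented data and the constraint that the coefficients sum to one:

```python
import numpy as np

# minimise ||A x - b||^2  subject to  C x = d, via the KKT system
#   [2A'A  C'] [x]   [2A'b]
#   [C     0 ] [z] = [  d ]
rng = np.random.default_rng(5)
A = rng.normal(size=(10, 3))
b = rng.normal(size=10)
C = np.array([[1.0, 1.0, 1.0]])    # constraint: coefficients sum to 1
d = np.array([1.0])

n, m = A.shape[1], C.shape[0]
KKT = np.block([[2.0 * A.T @ A, C.T],
                [C, np.zeros((m, m))]])
rhs = np.concatenate([2.0 * A.T @ b, d])
sol = np.linalg.solve(KKT, rhs)
x = sol[:n]                        # constrained minimiser (z is the multiplier)
```

The returned `x` satisfies the constraint exactly while minimizing the residual norm among all feasible points.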
Recursive least squares filter         
ADAPTIVE FILTER ALGORITHM FOR DIGITAL SIGNAL PROCESSING
Recursive least squares algorithm; Recursive Least Squares; Recursive least squares; RLS filter; RLS algorithm
Recursive least squares (RLS) is an adaptive filter algorithm that recursively finds the coefficients that minimize a weighted linear least squares cost function relating to the input signals. This approach is in contrast to other algorithms such as the least mean squares (LMS) that aim to reduce the mean square error.
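A sketch of the RLS recursion (forgetting factor 1, noise-free invented data): each new sample updates the estimate w and the inverse-correlation matrix P without re-solving the whole least-squares problem:

```python
import numpy as np

# RLS sketch: identify a hypothetical 3-tap system w_true sample by sample.
rng = np.random.default_rng(6)
w_true = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
P = 1e6 * np.eye(3)                # large P ~ "no prior information"

for _ in range(200):
    u = rng.normal(size=3)         # regressor vector for this sample
    d = w_true @ u                 # desired output (noise-free here)
    k = P @ u / (1.0 + u @ P @ u)  # gain vector
    w = w + k * (d - w @ u)        # update using the a priori error
    P = P - np.outer(k, u) @ P     # update inverse correlation matrix
```

Unlike LMS, each update uses the accumulated correlation information in P, which is why RLS typically converges faster at higher per-sample cost.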

Wikipedia

Least squares

The most important application is in data fitting. When the problem has substantial uncertainties in the independent variable (the x variable), then simple regression and least-squares methods have problems; in such cases, the methodology required for fitting errors-in-variables models may be considered instead of that for least squares.

Least squares problems fall into two categories: linear or ordinary least squares and nonlinear least squares, depending on whether or not the residuals are linear in all unknowns. The linear least-squares problem occurs in statistical regression analysis; it has a closed-form solution. The nonlinear problem is usually solved by iterative refinement; at each iteration the system is approximated by a linear one, and thus the core calculation is similar in both cases.

Polynomial least squares describes the variance in a prediction of the dependent variable as a function of the independent variable and the deviations from the fitted curve.

When the observations come from an exponential family whose natural sufficient statistic is the identity, and mild conditions are satisfied (e.g. for the normal, exponential, Poisson, and binomial distributions), standardized least-squares estimates and maximum-likelihood estimates are identical. The method of least squares can also be derived as a method of moments estimator.

The following discussion is mostly presented in terms of linear functions but the use of least squares is valid and practical for more general families of functions. Also, by iteratively applying local quadratic approximation to the likelihood (through the Fisher information), the least-squares method may be used to fit a generalized linear model.

The least-squares method was first published by Adrien-Marie Legendre (1805), though it is usually also co-credited to Carl Friedrich Gauss (1795), who contributed significant theoretical advances to the method and may have used it earlier in his own work.